Update LLMInterface to restore LC compatibility #416
base: main
Conversation
Force-pushed from 3475e0b to 2c4f4e5
print(res.content)

# If rate_limit_handler and async_rate_limit_handler decorators are used and you want to use a custom rate limit handler
# Type variables for function signatures used in rate limit handlers
The above comments on how to customise a rate limit handler are still valid, no?
Yes, actually only the first "If" comment is not needed anymore, as the rate limit handler is now applied to all LLMs. I've reintroduced the rest of the comment.
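To illustrate what the comments in the diff refer to, here is a minimal sketch of a decorator-based rate limit handler with exponential backoff. This is a generic illustration, not the library's actual `rate_limit_handler` implementation; the error type and parameter names are assumptions.

```python
import random
import time
from functools import wraps

def rate_limit_handler(max_attempts: int = 3, base_delay: float = 1.0):
    """Retry the wrapped LLM call with exponential backoff when a
    rate-limit error is raised (sketch; the real handler is provider-aware)."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(max_attempts):
                try:
                    return func(*args, **kwargs)
                except RuntimeError:  # stand-in for a provider rate-limit error
                    if attempt == max_attempts - 1:
                        raise
                    # exponential backoff with a small random jitter
                    time.sleep(base_delay * 2 ** attempt + random.random() * 0.1)
        return wrapper
    return decorator
```

A custom handler following this shape can then be applied to both the sync and async invoke paths, which is why the type-variable comment in the diff is kept.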
Force-pushed from c229aa3 to d9c0f21
Force-pushed from d9c0f21 to 7a4d4a0
Description
Restore LangChain chat models compatibility in the `LLMInterface`:

- The `input` parameter for all `invoke` methods now accepts `Union[str, list[LLMMessage]]`, where `LLMMessage` is a `TypedDict` with role/content keys.
- `system_instructions` and `message_history` are translated into a `list[LLMMessage]` that is passed down to the corresponding `_invoke` methods.

Type of Change
Complexity
Complexity: Low (many changes, but just refactoring in the end)
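As a reference for reviewers, the restored signature described above can be sketched as follows. `LLMMessage` and the `Union[str, list[LLMMessage]]` input type come from the PR description; the `build_messages` helper and its exact layout are hypothetical, shown only to make the translation step concrete.

```python
from typing import Optional, TypedDict, Union

class LLMMessage(TypedDict):
    role: str      # e.g. "system", "user", or "assistant"
    content: str

def build_messages(
    input: Union[str, list[LLMMessage]],
    system_instructions: Optional[str] = None,
) -> list[LLMMessage]:
    """Normalise the public `invoke` input into the list[LLMMessage]
    that gets passed down to the corresponding `_invoke` method (sketch)."""
    messages: list[LLMMessage] = []
    if system_instructions:
        messages.append({"role": "system", "content": system_instructions})
    if isinstance(input, str):
        messages.append({"role": "user", "content": input})
    else:
        messages.extend(input)
    return messages
```

With this shape, both a plain string prompt and a pre-built message list flow through the same `_invoke` path, which is what keeps the change a refactoring rather than a behaviour change.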
How Has This Been Tested?
Checklist
The following requirements should have been met (depending on the changes in the branch):